Updated 2024-09-20
Version 2409 (Build 16.0.17928.20148)
// Kurtosis
// Computes the rolling excess kurtosis of log returns and visualises the
// resulting probability density function (Gram-Charlier Type A expansion)
// alongside a standard Gaussian reference curve.
//@version=6
indicator("Kurtosis", overlay = false,
     max_polylines_count = 100, max_labels_count = 50)
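// For reference, the textbook fourth-order Gram-Charlier Type A density is
//     f(z) = phi(z) * [1 + (skew / 6) * (z^3 - 3z) + (excessKurt / 24) * (z^4 - 6z^2 + 3)]
// where phi(z) is the standard normal pdf and z is the standardised log return.
// This is the general form of the expansion; the implementation below may
// truncate or parameterise it differently.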
// ─────────────────────────── Inputs ───────────────────────────
Terminals should generate the 256-color palette from the user's base16 theme.
If you've spent much time in the terminal, you've probably set a custom base16 theme. They work well. You define a handful of colors in one place and all your programs use them.
The drawback is that 16 colors is limiting. Complex and color-heavy programs struggle with such a small palette.
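To make the idea concrete, here is a minimal sketch of how a terminal could derive the full 256-entry palette from a base16 theme. The blending scheme (linear interpolation in sRGB between the theme's background, foreground, and accent colors) and the choice of which theme slots to interpolate toward are illustrative assumptions, not a description of any existing terminal.

```python
# A minimal sketch: keep the standard 256-color layout (16 theme colors,
# a 6x6x6 cube at indices 16-231, a 24-step gray ramp at 232-255), but
# derive the cube and ramp from the base16 theme instead of the fixed
# xterm values. The interpolation targets (ANSI slots 0=background,
# 1=red, 2=green, 4=blue, 7=foreground) are assumptions for illustration.

def _lerp(a, b, t):
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

def derive_256(base16):
    """base16: sixteen (r, g, b) tuples in ANSI order; returns 256 colors."""
    palette = list(base16)                       # slots 0-15: the theme itself
    bg, fg = base16[0], base16[7]
    for r in range(6):                           # slots 16-231: color cube
        for g in range(6):
            for b in range(6):
                c = _lerp(bg, base16[1], r / 5)  # pull toward the red accent
                c = _lerp(c, base16[2], g / 5)   # then the green accent
                c = _lerp(c, base16[4], b / 5)   # then the blue accent
                palette.append(c)
    for i in range(24):                          # slots 232-255: gray ramp
        palette.append(_lerp(bg, fg, (i + 1) / 25))
    return palette
```

A terminal that built its 256-color table this way would keep color-heavy programs on-theme automatically, at the cost of exact xterm-palette compatibility.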
In this post, I'll start from scratch and build up to OpenClaw's architecture step by step, showing how you could have invented it yourself from first principles, using nothing but a messaging API, an LLM, and the desire to make AI actually useful outside the chat window.
End goal: understand how persistent AI assistants work, so you can build your own (or become an OpenClaw power user).
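As a preview of where we will end up, here is a deliberately minimal sketch of the core loop: a persistent process that polls a messaging API, hands each incoming message plus the accumulated history to an LLM, and sends the reply back. The `messenger` and `llm` objects and their methods are hypothetical stand-ins, not OpenClaw's actual interfaces.

```python
import time

class Assistant:
    """Minimal persistent-assistant loop; all interfaces here are illustrative."""

    def __init__(self, messenger, llm):
        self.messenger = messenger   # hypothetical wrapper around a messaging API
        self.llm = llm               # hypothetical chat-completion client
        self.history = []            # context that survives across messages

    def step(self):
        for msg in self.messenger.poll():             # new inbound messages
            self.history.append({"role": "user", "content": msg.text})
            reply = self.llm.complete(self.history)   # model sees the full history
            self.history.append({"role": "assistant", "content": reply})
            self.messenger.send(msg.chat_id, reply)

    def run(self, interval=2.0):
        while True:                                   # never "closes the tab"
            self.step()
            time.sleep(interval)
```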
When you use ChatGPT or Claude in a browser, there are several limitations:
// See image comparison https://imgur.com/a/9L2P7GJ
// Read details https://iolite-engine.com/blog_posts/minimal_agx_implementation
// Usage:
// 1. Open "Project Settings" and change "Working Color Space" to "sRGB / Rec709"
// 2. Open the `Engine\Shaders\Private\PostProcessTonemap.usf` file
// 3. Find the line `half3 OutDeviceColor = ColorLookupTable(FinalLinearColor);`
// 4. Replace it with `half3 OutDeviceColor = ApplyAgX(FinalLinearColor);`
// 5. Find the function `half3 ColorLookupTable( half3 LinearColor )`
// 6. After the end of that function, add the code below, then run `RecompileShaders Changed` from the console
| """ | |
| The most atomic way to train and run inference for a GPT in pure, dependency-free Python. | |
| This file is the complete algorithm. | |
| Everything else is just efficiency. | |
| @karpathy | |
| """ | |
| import os # os.path.exists | |
| import math # math.log, math.exp |